Multivariate f-divergence Estimation With Confidence

Authors

  • Kevin R. Moon
  • Alfred O. Hero
Abstract

The problem of f-divergence estimation is important in the fields of machine learning, information theory, and statistics. While several nonparametric divergence estimators exist, relatively few have known convergence properties. In particular, even for those estimators whose MSE convergence rates are known, the asymptotic distributions are unknown. We establish the asymptotic normality of a recently proposed ensemble estimator of f-divergence between two distributions from a finite number of samples. This estimator has an MSE convergence rate of O(1/T), is simple to implement, and performs well in high dimensions. This theory enables us to perform divergence-based inference tasks such as testing equality of pairs of distributions based on empirical samples. We experimentally validate our theoretical results and, as an illustration, use them to empirically bound the best achievable classification error.
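The ensemble estimator studied in the paper is not reproduced in this abstract. As a rough, self-contained illustration of nonparametric divergence estimation from two empirical samples, the sketch below implements a standard k-nearest-neighbor plug-in estimator of the Kullback-Leibler divergence (one member of the f-divergence family), in the style of Wang, Kulkarni, and Verdu. It is not the Moon-Hero ensemble estimator, and the function name and parameters are illustrative:

```python
import numpy as np

def knn_kl_divergence(x, y, k=5):
    """Plug-in k-NN estimate of KL(p || q) from samples x ~ p, y ~ q.

    Illustrative sketch only; not the ensemble estimator of the paper.
    x: (n, d) array of samples from p; y: (m, d) array of samples from q.
    """
    n, d = x.shape
    m = y.shape[0]

    def kth_dist(a, b, k, exclude_self):
        # Euclidean distance from each row of a to its k-th nearest row of b.
        dists = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
        dists.sort(axis=1)
        # When a == b, index 0 is the zero self-distance, so skip it.
        return dists[:, k] if exclude_self else dists[:, k - 1]

    rho = kth_dist(x, x, k, exclude_self=True)   # k-NN radius within x
    nu = kth_dist(x, y, k, exclude_self=False)   # k-NN radius into y
    # Wang-Kulkarni-Verdu style plug-in estimate of KL(p || q).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

For two standard Gaussians shifted by one unit in 1-D, the true KL divergence is 0.5, and the estimate should land near that value for a few thousand samples; for identical distributions it should be near zero.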


Similar Resources

Divergence times and morphological evolution of the subtribe Eritrichiinae (Boraginaceae-Rochelieae) with special reference to Lappula

The subtribe Eritrichiinae belongs to tribe Rochelieae (Boraginaceae; Cynoglossoideae), which comprises about 200 species in five genera: Eritrichium, Lappula, Hackelia, Lepechiniella, and Rochelia. The majority of the species are annual and grow in xeric habitats. The genus Lappula, as an arid-adapted taxon and the second largest genus...


Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides a robust estimate when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
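The cited article's full estimator is not available in this snippet. As a hedged sketch of the general idea, the code below fits a linear regression by gradient ascent on a density power divergence objective with the error scale treated as known; the function name, fixed sigma, and step-size choices are all illustrative assumptions, not the article's method:

```python
import numpy as np

def mdpde_regression(X, y, alpha=0.5, sigma=1.0, lr=0.5, iters=300):
    """Minimum density power divergence sketch for y = X @ beta + e.

    Assumes Gaussian errors with known scale sigma; alpha > 0 trades
    efficiency for robustness (alpha -> 0 approaches least squares).
    Illustrative only, not the estimator of the cited article.
    """
    # Ordinary least squares as a starting point.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    n = X.shape[0]
    for _ in range(iters):
        r = y - X @ beta
        # Exponential weights: gross outliers get weight ~ 0.
        w = np.exp(-alpha * r**2 / (2 * sigma**2))
        # Gradient ascent on the DPD objective (constants absorbed into lr).
        beta += lr * X.T @ (w * r) / (n * sigma**2)
    return beta
```

With 10% of responses shifted far from the regression line, ordinary least squares is pulled toward the outliers while this weighted fit stays close to the clean-data coefficients, which is the robustness property the snippet describes.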


Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating a true density h(.) based upon a random sample X1, ..., Xn. In general, h(.) is approximated using an appropriate (in some sense; see below) model fθ(x). This article, using Vuong's (1989) test along with a collection of k (> 2) non-nested models, constructs a set of appropriate models, called a model confidence set, for the unknown model h(.). Application of such confide...


Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to be computationally superior. This paper explores...
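The snippet cites coordinate descent for penalized regression; the paper's Bregman divergence setting is not shown here. As a minimal sketch of plain cyclic coordinate descent for the lasso (squared-error loss with an L1 penalty, soft-thresholding each coefficient in turn), assuming nothing beyond a design matrix with nonzero columns:

```python
import numpy as np

def lasso_cd(X, y, lam, iters=200):
    """Lasso via cyclic coordinate descent: minimize
    0.5 * ||y - X @ beta||^2 + lam * ||beta||_1.

    Minimal illustration of the CD update, not the paper's algorithm.
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)  # per-column squared norms
    r = y - X @ beta               # running residual
    for _ in range(iters):
        for j in range(p):
            r += X[:, j] * beta[j]        # remove coordinate j's contribution
            rho = X[:, j] @ r             # partial correlation with residual
            # Soft-threshold update for coordinate j.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]        # restore the residual
    return beta
```

On a sparse synthetic problem, the fit recovers the two true nonzero coefficients (slightly shrunk by the penalty) and sets the irrelevant ones to or near zero, which is the variable-selection behavior the snippet motivates.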


Factors Related to Cultural Divergence in the Social Values among Two Generations of Families in Tehran

Cultural divergence is incongruity in the values, attitudes, and behaviors among different generations of a society. Identifying factors related to cultural divergence in the social values of parents and children is one of the most important issues in Iranian society, which has been investigated in the present research through a quantitative approach. The statistical population of the study co...




Publication date: 2014